
    Burning in the management of heathlands of Erica ciliaris and Erica tetralix: effects on structure and diversity

    Can controlled burning be used as a management tool for Erica ciliaris and Erica tetralix wet heathlands? Two E. ciliaris and E. tetralix communities were selected and two 5 x 5 m plots were established in each. These were characterised on the basis of frequency, cover values and plant species composition, then subjected to experimental burning, after which the plots were sampled twice a year for the following four and a half years. The results show that the cover of woody species very quickly reached the values of the control plots. Diversity and species composition did not change notably during this period; however, temporal heterogeneity indicates that the main changes occur in the first 18 months of secondary succession. The multivariate analysis showed that the samples recorded during this time were grouped as a function of the cover values of the species, which shows that stages exist in the vegetation recovery of these communities. The damage produced by fire in the community is minor, as a rapid recovery of the vegetation was observed, so controlled burning is a useful tool in the management of these heathlands.

    A distributed multi-disciplinary design optimization benchmark test suite with constraints and multiple conflicting objectives

    Collaborative optimization (CO) is an architecture within the multi-disciplinary design optimization (MDO) paradigm that partitions a constrained optimization problem into system and subsystem problems, with couplings between them. Multi-objective CO has multiple objectives at the system level and inequality constraints at the subsystem level. Whilst CO is an established technique, there are currently no scalable, constrained benchmark problems for multi-objective CO. In this study, we extend recent methods for generating scalable MDO benchmarks to propose a new benchmark test suite for multi-objective CO, called 'CO-ZDT', that is scalable in the number of disciplines and variables. We show that overly constraining the number of generations in each iteration of the system-level optimizer leads to poor consistency-constraint satisfaction. Increasing the number of subsystems in each of the problems leads to increasing system-level constraint violation. In problems with two subsystems, we find that convergence to the global Pareto front is very sensitive to the complexity of the landscape of the original non-decomposed problem. As the number of subsystems increases, convergence issues are encountered even for the simpler problem landscapes.
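    The suite's name suggests it builds on the classic ZDT problems. As an illustration only (the paper's exact CO-ZDT formulation is not reproduced here), a minimal evaluation of the well-known ZDT1 bi-objective benchmark looks like:

```python
import math

def zdt1(x):
    """ZDT1 benchmark (minimization): the global Pareto front is
    reached when g(x) = 1, i.e. all variables after the first are 0."""
    f1 = x[0]
    g = 1.0 + 9.0 * sum(x[1:]) / (len(x) - 1)
    f2 = g * (1.0 - math.sqrt(f1 / g))
    return f1, f2

# A point on the global Pareto front: all "distance" variables at zero.
print(zdt1([0.25, 0.0, 0.0, 0.0]))  # (0.25, 0.5)
```

    A CO-style decomposition would distribute the variables of such a problem across subsystems and enforce consistency between local copies, which is where the constraint-violation behaviour discussed above arises.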

    Collaborative Multi-Objective Optimization for Distributed Design of Complex Products

    Multidisciplinary design optimization problems with competing objectives that involve several interacting components can be called complex systems. Nowadays, it is common to partition the optimization problem of a complex system into smaller subsystems, each with a subproblem, in part because it is too difficult to deal with the problem all-at-once. Such an approach is suitable for large organisations where each subsystem can have its own (specialised) design team. However, this requires a design process that facilitates collaboration and decision making in an environment where teams may exchange only limited information about their own designs, and where the design teams work at different rates, have different time schedules, and are normally not co-located. A multiobjective optimization methodology to address these features is described. Subsystems exchange information about their own optimal solutions on a peer-to-peer basis, and the methodology enables convergence to a set of optimal solutions that satisfy the overall system. This is demonstrated on an example problem where the methodology is shown to perform as well as the ideal but “unrealistic” approach that treats the optimization problem all-at-once.
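    The building block of any such peer-to-peer exchange of "own optimal solutions" is Pareto dominance. A minimal sketch of the nondominated filtering a subsystem could apply to a pool of exchanged solutions (illustrative only; this is not the paper's actual protocol):

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization):
    a is no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def nondominated(points):
    """Keep only the points not dominated by any other point in the pool."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

pool = [(1, 5), (2, 4), (3, 3), (2, 6), (4, 4)]
print(nondominated(pool))  # [(1, 5), (2, 4), (3, 3)]
```

    In a distributed setting, each team would merge incoming solutions into its local pool and re-filter, so only mutually nondominated trade-offs survive the exchange.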

    Application Domain Study of Evolutionary Algorithms in Optimization Problems

    This paper deals with the problem of comparing and testing evolutionary algorithms, that is, the benchmarking problem, from an analysis point of view. A practical study of the application domain of four representative evolutionary algorithms is carried out using a relevant set of real-parameter function optimization benchmarks. The four selected algorithms are the Covariance Matrix Adaptation Evolution Strategy (CMA-ES) and Differential Evolution (DE), due to their successful results in recent studies; a Genetic Algorithm with real-parameter operators, used here as a reference approach because it is probably the most familiar to researchers; and the Macroevolutionary algorithm (MA), which is not widely known but shows very remarkable behavior in some problems. The algorithms have been compared by running several tests over the benchmark function set to analyze their capabilities from a practical point of view, in other words, in terms of their usability. The characterization of the algorithms is based on accuracy, stability and time-consumption parameters, thus establishing their operational scope and the type of optimization problems for which they are most suitable.
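    To ground one of the compared algorithms, here is a minimal DE/rand/1/bin sketch on the sphere function. The population size, F, CR and generation count are illustrative assumptions, not the study's settings:

```python
import random

def de_minimize(f, bounds, pop_size=20, F=0.8, CR=0.9, gens=200, seed=1):
    """Minimal DE/rand/1/bin: mutate with a scaled difference of two
    random members, binomial crossover, greedy one-to-one selection."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    fit = [f(x) for x in pop]
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            jr = rng.randrange(dim)  # guarantee at least one mutant gene
            trial = [
                min(max(pop[a][j] + F * (pop[b][j] - pop[c][j]),
                        bounds[j][0]), bounds[j][1])
                if (rng.random() < CR or j == jr) else pop[i][j]
                for j in range(dim)
            ]
            ft = f(trial)
            if ft <= fit[i]:  # keep the trial if it is no worse
                pop[i], fit[i] = trial, ft
    best = min(range(pop_size), key=fit.__getitem__)
    return pop[best], fit[best]

# Sphere function: global minimum 0 at the origin.
x, fx = de_minimize(lambda v: sum(t * t for t in v), [(-5, 5)] * 3)
print(fx)
```

    Benchmarking studies like this one typically repeat such runs over many seeds and functions, then compare accuracy and stability statistics rather than single runs.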

    ON THE ANALYSIS OF TURBULENT FLOW SIGNALS BY ARTIFICIAL NEURAL NETWORKS AND ADAPTIVE TECHNIQUES

    Artificial Neural Networks (ANNs) and evolution are applied to the analysis of turbulent signals. In a first stage, a new trainable delay-based artificial neural network is used to analyze Hot Wire Anemometer (HW) signals obtained at different positions within the wake of a circular cylinder, with Reynolds number values ranging from 2000 to 8000. Results show that these networks are capable of performing accurate short-term predictions of the turbulent signal. In addition, the ANNs can be set in a long-term prediction mode, resulting in a kind of nonlinear filter able to extract the features associated with the larger eddies and coherent structures. In a second stage, these networks are used to reconstruct a regularly sampled signal directly from the irregularly sampled one provided by a Laser Doppler Anemometer (LDA). The irregular sampling dynamics of the LDA signals are governed by the arrival of the seeding particles, superimposed on the already complex characteristics of the turbulent signal. To cope with this complexity, an evolutionary strategy is used to perform adaptive, continuous online training of the ANNs. This approach yields a regularly sampled signal not by interpolating the original one, as is often done, but by modeling it.
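    The core idea of delay-based short-term prediction can be illustrated with a plain linear delay model trained by least-mean-squares updates. This is a stand-in for the paper's neural network, and the signal below is synthetic, not a hot-wire trace:

```python
import math

def delay_embed(signal, p):
    """Build (delay vector, next value) training pairs of order p."""
    return [(signal[i - p:i], signal[i]) for i in range(p, len(signal))]

def train_linear_predictor(pairs, lr=0.01, epochs=500):
    """LMS training of a linear one-step-ahead predictor."""
    p = len(pairs[0][0])
    w = [0.0] * p
    for _ in range(epochs):
        for x, y in pairs:
            err = sum(wi * xi for wi, xi in zip(w, x)) - y
            for j in range(p):
                w[j] -= lr * err * x[j]
    return w

# Synthetic quasi-periodic signal; a pure sinusoid obeys an exact
# order-2 linear recurrence, so the predictor can learn it well.
sig = [math.sin(0.3 * t) for t in range(200)]
pairs = delay_embed(sig, 2)
w = train_linear_predictor(pairs)
x, y = pairs[-1]
pred = sum(wi * xi for wi, xi in zip(w, x))
print(abs(pred - y))  # small one-step prediction error
```

    A real turbulent signal is of course not linearly predictable; the trainable delays and nonlinear units in the paper's network are what make short-term prediction feasible there.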

    The broad iron Kalpha line of Cygnus X-1 as seen by XMM-Newton in the EPIC-pn modified timing mode

    We present the analysis of the broadened, fluorescent iron Kalpha line in simultaneous XMM-Newton and RXTE data from the black hole Cygnus X-1. The XMM-Newton data were taken in a modified version of the timing mode of the EPIC-pn camera. In this mode the lower energy threshold of the instrument is increased to 2.8 keV to avoid telemetry dropouts due to the brightness of the source, while at the same time preserving the signal-to-noise ratio in the Fe Kalpha band. We find that the best-fit spectrum consists of the sum of an exponentially cut-off power law and relativistically smeared, ionized reflection. The shape of the broadened Fe Kalpha feature is due to strong Compton broadening combined with relativistic broadening. Assuming a standard, thin accretion disk, the black hole is close to rotating maximally. Comment: Astron. Astrophys., in press.

    Discriminative capacity and construct validity of the Clock Drawing Test in Mild Cognitive Impairment and Alzheimer's disease

    OBJECTIVES: The aim of this study was to analyze the psychometric and diagnostic properties of the Clock Drawing Test (CDT), scored according to the Babins, Rouleau, and Cahn scoring systems, for Mild Cognitive Impairment (MCI) and Alzheimer's disease (AD) screening, and to develop corresponding cutoff scores. Additionally, we assessed the construct validity of the CDT through exploratory and confirmatory factor analysis. METHODS: We developed a cross-sectional study of ambulatory MCI and AD patients, divided into two clinical groups (450 MCI and 250 mild AD patients) and a normal control group (N = 400). All participants were assessed with the CDT, Mini-Mental State Examination (MMSE) and Montreal Cognitive Assessment (MoCA) for convergent validity. RESULTS: The selected scoring systems presented adequate validity and reliability values. The proposed cutoff scores showed 60 to 65% sensitivity and 58 to 62% specificity to identify MCI patients. The corresponding values for AD were 84 to 90% sensitivity and 76 to 78% specificity. Exploratory and confirmatory factor analysis revealed that the Babins scoring system had good construct validity and allowed us to propose a three-factor model for this system. CONCLUSIONS: Our results confirmed the complexity of the CDT and support it as a cognitive screening instrument particularly sensitive to AD. The use of the CDT with MCI patients should be interpreted with more caution due to the lower sensitivity and specificity for milder forms of cognitive impairment.
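    Sensitivity and specificity follow directly from a 2x2 confusion table of screening outcomes. The counts below are hypothetical, chosen only to land near the reported AD range; the study's actual tables are not reproduced here:

```python
def screening_metrics(tp, fn, tn, fp):
    """Sensitivity = true positives among the diseased;
    specificity = true negatives among the healthy."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Hypothetical counts: 250 AD patients, 400 controls (group sizes
# from the abstract; the split into TP/FN/TN/FP is illustrative).
sens, spec = screening_metrics(tp=213, fn=37, tn=308, fp=92)
print(round(sens, 2), round(spec, 2))  # 0.85 0.77
```

    The lower MCI figures (60-65% sensitivity, 58-62% specificity) explain the authors' caution: near those values, a positive or negative CDT screen shifts the diagnostic probability only modestly.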